Gradient Descent Batch Clustering for Image Classification

Authors

Abstract

The batch clustering algorithm, when used for classification applications, requires initial parameters and, as a stochastic process, exhibits a drifting phenomenon; both are critical for convergence to a partial optimum, and the original algorithm's search space and convergence speed can still be improved through these parameters. This paper proposes an unsupervised method that addresses these two issues. First, a preliminary estimate of the initial parameters is obtained in a hierarchical manner via principal component analysis (PCA), based on an estimated nonlinear mathematical connection between PCA and cluster membership. With these estimated parameters, the drifting issue is addressed by combining gradient descent with an auxiliary objective that refines the efficiency of the process, which is proved through the relationship of quadratic functions followed by a justification. In addition, the effectiveness of the proposed method is validated with the statistical F-measure in a classification application. The validation results show that the method achieves a significantly better accuracy trade-off in comparison with existing algorithms under the mean squared error (MSE) criterion.
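As a rough illustration of the two ideas in the abstract, the following is a minimal sketch assuming a standard k-means-style objective: cluster centers are seeded from a one-dimensional PCA projection (a much simpler stand-in for the paper's hierarchical PCA estimation) and then refined by batch gradient descent on the MSE clustering objective. The paper's auxiliary objective is not reproduced, and all function names here are hypothetical.

```python
import numpy as np

def pca_init_centers(X, k):
    """Seed k centers by spreading them along the first principal axis."""
    Xc = X - X.mean(axis=0)
    # Leading right-singular vector = first principal component direction.
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ vt[0]                      # 1-D PCA scores per point
    order = np.argsort(scores)
    # Mean of each of k equal-sized slices along the principal axis.
    return np.stack([X[idx].mean(axis=0) for idx in np.array_split(order, k)])

def gd_batch_clustering(X, k, lr=0.1, n_iter=100):
    """Batch gradient descent on the MSE objective
    J = (1/n) * sum_i ||x_i - c_{a(i)}||^2 with hard assignments a(i)."""
    centers = pca_init_centers(X, k)
    n = len(X)
    for _ in range(n_iter):
        # Hard assignment of every point to its nearest center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(axis=1)
        # Gradient of J w.r.t. center j: (2/n) * sum_{i: a(i)=j} (c_j - x_i)
        for j in range(k):
            pts = X[assign == j]
            if len(pts):
                grad = 2.0 * (len(pts) * centers[j] - pts.sum(axis=0)) / n
                centers[j] -= lr * grad
    return centers, assign

# Toy data: three well-separated Gaussian blobs in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in ((0, 0), (2, 2), (4, 0))])
centers, labels = gd_batch_clustering(X, k=3)
```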


Related Articles

Cost-Sensitive Approach to Batch Size Adaptation for Gradient Descent

In this paper we propose a novel approach to automatically determine the batch size in stochastic gradient descent methods. The choice of the batch size induces a trade-off between the accuracy of the gradient estimate and the cost in terms of samples of each update. We propose to determine the batch size by optimizing the ratio between a lower bound to a linear or quadratic Taylor approximatio...
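The exact criterion is cut off above, but the trade-off it describes can be illustrated generically (this is not the paper's method): larger batches give a more accurate gradient estimate while costing more samples per update, so one can pick the batch size maximizing a simple "expected gain per sample" ratio. The first-order Taylor gain and variance penalty below are illustrative assumptions.

```python
import numpy as np

def choose_batch_size(grad_norm_sq, sigma_sq, lr,
                      candidates=(8, 16, 32, 64, 128, 256)):
    """Return the candidate batch size m maximizing gain / sample cost."""
    best_m, best_ratio = candidates[0], -np.inf
    for m in candidates:
        # Taylor-style expected gain, penalized by the gradient estimator's
        # variance, which shrinks like sigma^2 / m as the batch grows.
        gain = lr * (grad_norm_sq - sigma_sq / m)
        ratio = gain / m                     # gain per sample consumed
        if ratio > best_ratio:
            best_m, best_ratio = m, ratio
    return best_m

# Strong gradient signal and moderate noise -> small batches win here.
print(choose_batch_size(grad_norm_sq=4.0, sigma_sq=8.0, lr=0.1))  # -> 8
```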


The general inefficiency of batch training for gradient descent learning

Gradient descent training of neural networks can be done in either a batch or on-line manner. A widely held myth in the neural network community is that batch training is as fast or faster and/or more 'correct' than on-line training because it supposedly uses a better approximation of the true gradient for its weight updates. This paper explains why batch training is almost always slower than o...
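The contrast the abstract draws can be seen in a few lines. This sketch, on a toy least-squares problem, compares the two schemes: batch descent takes one update per full pass over the data, while on-line descent updates after every example and typically makes progress with fewer passes.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=200)

def batch_gd(X, y, lr=0.1, epochs=50):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)    # gradient over the whole set
        w -= lr * grad                        # one update per epoch
    return w

def online_gd(X, y, lr=0.1, epochs=50):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            w -= lr * (xi @ w - yi) * xi      # one update per example
    return w

print(np.linalg.norm(batch_gd(X, y) - w_true))   # slower convergence
print(np.linalg.norm(online_gd(X, y) - w_true))  # faster convergence
```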


Solving Prediction Games with Parallel Batch Gradient Descent

Learning problems in which an adversary can perturb instances at application time can be modeled as games with data-dependent cost functions. In an equilibrium point, the learner’s model parameters are the optimal reaction to the data generator’s perturbation, and vice versa. Finding an equilibrium point requires the solution of a difficult optimization problem for which both the learner’s mode...
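The snippet is truncated, but the equilibrium search it describes can be sketched on a toy problem (this is not the paper's algorithm, and the quadratic perturbation cost is an assumption): the learner takes gradient descent steps on a regularized squared loss over perturbed instances, while the adversary takes gradient ascent steps on the same loss minus its perturbation cost, until neither side moves.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([1.0, -1.0])

w = np.zeros(2)                   # learner's model parameters
D = np.zeros_like(X)              # adversary's perturbation of the instances
lr, cost = 0.05, 1.0              # step size and adversary's quadratic cost

for _ in range(500):
    Xp = X + D
    resid = Xp @ w - y
    # Learner: gradient *descent* on the (ridge-regularized) squared loss.
    w -= lr * (Xp.T @ resid / len(y) + 0.01 * w)
    # Adversary: gradient *ascent* on the loss, minus its perturbation cost.
    D += lr * (np.outer(resid, w) / len(y) - cost * D)

print(w)  # approximate equilibrium model parameters
```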


Fully Distributed Privacy Preserving Mini-batch Gradient Descent Learning

In fully distributed machine learning, privacy and security are important issues. These issues are often dealt with using secure multiparty computation (MPC). However, in our application domain, known MPC algorithms are not scalable or not robust enough. We propose a light-weight protocol to quickly and securely compute the sum of the inputs of a subset of participants assuming a semi-honest ad...
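One standard lightweight way to compute such a private sum against semi-honest parties is additive secret sharing, sketched below (the paper's actual protocol is not reproduced): each participant splits its value into random shares modulo q, one per peer, so no peer ever sees an input, yet the published subtotals sum to the true total.

```python
import random

Q = 2**61 - 1  # large prime modulus; shares are uniform in [0, Q)

def make_shares(secret, n_parties):
    """Split `secret` into n additive shares that sum to it mod Q."""
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

inputs = [13, 42, 7]                          # each party's private value
n = len(inputs)
# Party i sends share j of its input to party j.
shares = [make_shares(v, n) for v in inputs]
# Each party sums the shares it received and publishes only that subtotal.
subtotals = [sum(shares[i][j] for i in range(n)) % Q for j in range(n)]
assert sum(subtotals) % Q == sum(inputs) % Q  # recovers the true sum: 62
```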


Robust Method for E-Maximization and Hierarchical Clustering of Image Classification

We developed a new semi-supervised EM-like algorithm that is given the set of objects present in each training image, but does not know which regions correspond to which objects. We have tested the algorithm on a dataset of 860 hand-labeled color images using only color and texture features, and the results show that our EM variant is able to break the symmetry in the initial solution. We compared...
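The setting described above can be sketched generically (this is not the authors' algorithm): each image carries a set of object labels, regions are feature vectors, and an EM-like loop softly assigns each region only to objects known to be present in its image, then refits per-object Gaussian means from those soft assignments. The isotropic-Gaussian responsibilities are an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n_objects, dim = 3, 2
means = rng.normal(size=(n_objects, dim))       # current per-object models

# Toy data: (region_features, set_of_object_labels_present_in_the_image)
images = [
    (rng.normal(size=(5, dim)), [0, 1]),
    (rng.normal(size=(4, dim)), [1, 2]),
]

for _ in range(20):
    num = np.zeros_like(means)
    den = np.zeros(n_objects)
    for feats, present in images:
        # E-step: responsibilities restricted to this image's label set.
        d2 = ((feats[:, None, :] - means[None, present, :]) ** 2).sum(-1)
        r = np.exp(-0.5 * d2)
        r /= r.sum(axis=1, keepdims=True)
        # Accumulate sufficient statistics for the M-step.
        for j, obj in enumerate(present):
            num[obj] += r[:, j] @ feats
            den[obj] += r[:, j].sum()
    # M-step: refit each object's mean from its soft region assignments.
    means = num / np.maximum(den, 1e-9)[:, None]
```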



Journal

Journal title: Image Analysis & Stereology

Year: 2023

ISSN: 1854-5165, 1580-3139

DOI: https://doi.org/10.5566/ias.2905